This repository has been archived by the owner on Sep 12, 2022. It is now read-only.
Add blueprint parameters as env variables in before/after scripts #52
This exposes blueprint parameters to the before/after scripts.
Previously, if you wanted to pass envs, vars, or params to a before/after script, you had to pass them as arguments. Then in your `zip-and-upload-to-s3.sh` script you had to use `$1`, `$2`, etc. This gets messy when my lambda functions start requiring a lot of blueprint vars, resources, etc.
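A minimal sketch of that old approach (all names here are hypothetical, not from this PR): each value arrives as a positional argument, so the script has to unpack `$1`, `$2`, ... by position:

```shell
# Old style: values are passed positionally, so the script must know
# which position holds which parameter (names are made up).
old_style_upload() {
  bucket="$1"   # first blueprint value, by position
  table="$2"    # second blueprint value, by position
  echo "bucket=$bucket table=$table"
}

old_style_upload my-bucket my-table
```

Adding or reordering a parameter means touching both the caller and every `$N` reference in the script.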
With this PR, all blueprint params are added to the environment, and I can access them in the script directly as environment variables.
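As a sketch of the new approach, assuming a blueprint parameter named `Bucket` is exported into the environment (the actual variable names depend on your blueprint's parameters):

```shell
# New style: the parameter is read by name from the environment, so the
# script no longer cares about argument order. "Bucket" is a
# hypothetical parameter name for illustration.
upload_to_s3() {
  echo "uploading build.zip to ${Bucket}"
}

Bucket=my-bucket
export Bucket
upload_to_s3
```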
This is really handy, for example, when you have a lot of DynamoDB table names that you need to pass to your script. You can use a function to replace all `{{PARAM}}`-style tags with their values as they're added.
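One hypothetical shape such a helper could take (not the PR's own code): for each parameter name given, replace every `{{NAME}}` tag in a template file with the value of the environment variable `NAME`:

```shell
# Hypothetical helper, assuming GNU sed and parameter values that
# contain no | or & characters.
render_params() {
  file="$1"; shift          # $1: template file with {{PARAM}} tags
  for name in "$@"; do      # remaining args: parameter names
    eval "value=\$$name"    # indirect lookup of the variable $NAME
    sed -i "s|{{${name}}}|${value}|g" "$file"
  done
}
```

Usage would look like `render_params config.tpl MyTable OtherTable`, with `MyTable` and `OtherTable` already exported by the blueprint.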